
    Simpler is better: a novel genetic algorithm to induce compact multi-label chain classifiers

    Multi-label classification (MLC) is the task of assigning multiple class labels to an object based on the features that describe the object. One of the most effective MLC methods is known as Classifier Chains (CC). This approach consists of training $q$ binary classifiers linked in a chain, $y_1 \rightarrow y_2 \rightarrow \dots \rightarrow y_q$, with each classifier responsible for predicting a specific label in $\{l_1, l_2, \dots, l_q\}$. The chaining mechanism allows each individual classifier to incorporate the predictions of the previous ones as additional information at classification time, so that possible correlations among labels can be automatically exploited. Nevertheless, CC suffers from two important drawbacks: (i) the label ordering is decided at random, although it usually has a strong effect on predictive accuracy; (ii) all labels are inserted into the chain, although some of them might carry irrelevant information for discriminating the others. In this paper we tackle both problems at once by proposing a novel genetic algorithm capable of searching for a single optimized label ordering while also considering the use of partial chains. Experiments on benchmark datasets demonstrate that our approach is able to produce models that are both simpler and more accurate.
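
    The following is a minimal sketch of the standard Classifier Chains mechanism the abstract describes (not the paper's genetic algorithm): each binary classifier receives the feature vector plus the labels earlier in the chain as extra inputs. The dataset, base learner, and label ordering below are illustrative assumptions.

```python
# Sketch of the Classifier Chains (CC) idea: train q binary classifiers in a
# fixed order, each seeing the previous labels as additional features.
# Illustrative only; the paper's contribution (GA-optimized partial chains)
# is not implemented here.
import numpy as np
from sklearn.datasets import make_multilabel_classification
from sklearn.linear_model import LogisticRegression

X, Y = make_multilabel_classification(n_samples=200, n_features=10,
                                      n_classes=4, random_state=0)

order = [2, 0, 3, 1]        # one label ordering; plain CC draws this at random
chain = []

# Training: the classifier for label order[i] sees X plus the true values
# of the labels that precede it in the chain.
X_aug = X.copy()
for j in order:
    clf = LogisticRegression(max_iter=1000).fit(X_aug, Y[:, j])
    chain.append(clf)
    X_aug = np.hstack([X_aug, Y[:, [j]]])

def predict(chain, order, X_new):
    """Predict labels by feeding earlier predictions forward along the chain,
    which is how possible label correlations get exploited."""
    X_aug = X_new.copy()
    Y_pred = np.zeros((X_new.shape[0], len(order)), dtype=int)
    for clf, j in zip(chain, order):
        Y_pred[:, j] = clf.predict(X_aug)
        X_aug = np.hstack([X_aug, Y_pred[:, [j]]])
    return Y_pred

print(predict(chain, order, X[:5]))
```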

    No-horizon theorem for spacetimes with spacelike G1 isometry groups

    We consider four-dimensional spacetimes $(M,\mathbf{g})$ which obey the Einstein equations $\mathbf{G}=\mathbf{T}$ and admit a global spacelike $G_{1}=\mathbb{R}$ isometry group. By means of dimensional reduction and local analysis on the reduced (2+1) spacetime, we obtain a sufficient condition on $\mathbf{T}$ which guarantees that $(M,\mathbf{g})$ cannot contain apparent horizons. Given any (3+1) spacetime with spacelike translational isometry, the no-horizon condition can be readily tested without the need for dimensional reduction. This thus provides a useful and encompassing apparent-horizon test for $G_{1}$-symmetric spacetimes. We argue that this adds further evidence towards the validity of the hoop conjecture, and signals possible violations of strong cosmic censorship. Comment: 8 pages, LaTeX, uses IOP package; published in Class. Quantum Grav.

    Distinguishing noise from chaos: objective versus subjective criteria using Horizontal Visibility Graph

    A recently proposed methodology called the Horizontal Visibility Graph (HVG) [Luque et al., Phys. Rev. E 80, 046103 (2009)], which constitutes a geometrical simplification of the well-known Visibility Graph algorithm [Lacasa et al., Proc. Natl. Acad. Sci. U.S.A. 105, 4972 (2008)], has been used to study the distinction between deterministic and stochastic components in time series [L. Lacasa and R. Toral, Phys. Rev. E 82, 036120 (2010)]. Specifically, the authors propose that the node degree distribution of these processes follows an exponential of the form $P(\kappa)\sim \exp(-\lambda\kappa)$, in which $\kappa$ is the node degree and $\lambda$ is a positive parameter able to distinguish between deterministic (chaotic) and stochastic (uncorrelated and correlated) dynamics. In this work, we investigate the characteristics of the node degree distributions constructed by using the HVG for time series corresponding to 28 chaotic maps and 3 different stochastic processes. We thoroughly study the methodology proposed by Lacasa and Toral, finding several cases for which their hypothesis is not valid. We propose a methodology that uses the HVG together with Information Theory quantifiers. An extensive and careful analysis of the node degree distributions obtained by applying the HVG allows us to conclude that the Fisher-Shannon information plane is a remarkable tool, able to graphically represent the different nature, deterministic or stochastic, of the systems under study. Comment: Submitted to PLOS One
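
    Below is a minimal sketch of the HVG construction and the empirical degree distribution $P(\kappa)$ it yields, using a brute-force pairwise test for clarity. The logistic-map series and the exponential fit are illustrative assumptions, not the paper's 28 maps or its Information Theory quantifiers.

```python
# Horizontal Visibility Graph sketch: nodes i < j are linked iff every value
# strictly between them lies below min(x[i], x[j]). We then estimate the
# exponential decay rate lambda of the empirical degree distribution P(k).
import numpy as np

def hvg_degrees(x):
    """Degree of each node in the horizontal visibility graph of series x."""
    n = len(x)
    deg = np.zeros(n, dtype=int)
    for i in range(n):
        for j in range(i + 1, n):
            if all(x[k] < min(x[i], x[j]) for k in range(i + 1, j)):
                deg[i] += 1
                deg[j] += 1
    return deg

def degree_distribution(deg):
    ks = np.arange(deg.min(), deg.max() + 1)
    pk = np.array([(deg == k).mean() for k in ks])
    return ks, pk

# Illustrative chaotic series: fully developed logistic map x_{t+1} = 4 x_t (1 - x_t).
x = np.empty(1000)
x[0] = 0.4
for t in range(999):
    x[t + 1] = 4.0 * x[t] * (1.0 - x[t])

ks, pk = degree_distribution(hvg_degrees(x))

# Fit P(k) ~ exp(-lambda * k) by linear regression on log P(k) over nonzero bins.
mask = pk > 0
lam = -np.polyfit(ks[mask], np.log(pk[mask]), 1)[0]
print("estimated lambda:", lam)
```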

    Scaling limits of a tagged particle in the exclusion process with variable diffusion coefficient

    We prove a law of large numbers and a central limit theorem for a tagged particle in a symmetric simple exclusion process on the one-dimensional lattice with variable diffusion coefficient. The scaling limits are obtained from a similar result for the current through -1/2 for a zero-range process with bond disorder. For the CLT, we prove convergence to a fractional Brownian motion of Hurst exponent 1/4. Comment: 9 pages
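
    As a purely illustrative aid, here is a minimal Monte Carlo sketch of the model the abstract refers to: a symmetric simple exclusion process on a ring in which each particle carries its own jump rate (a simple stand-in for "variable diffusion coefficient"), with the displacement of one tagged particle tracked. The paper's results are analytic; nothing here reproduces its proofs, and all parameters are assumptions.

```python
# Symmetric simple exclusion process on a ring with particle-wise jump rates,
# tracking the signed displacement of a tagged particle. Discrete attempted
# jumps chosen proportionally to the rates approximate the continuous-time
# dynamics up to a time rescaling.
import numpy as np

rng = np.random.default_rng(0)

L = 200                                   # ring size (illustrative)
N = 100                                   # number of particles (density 1/2)
T = 10_000                                # number of attempted jumps

pos = rng.choice(L, size=N, replace=False)     # particle positions
occupied = np.zeros(L, dtype=bool)
occupied[pos] = True
rates = rng.uniform(0.5, 2.0, size=N)          # particle-wise jump rates
tagged = 0                                     # index of the tagged particle
displacement = 0                               # signed displacement of the tagged particle

for _ in range(T):
    i = rng.choice(N, p=rates / rates.sum())   # pick a particle by its rate
    step = rng.choice([-1, 1])                 # symmetric jump direction
    target = (pos[i] + step) % L
    if not occupied[target]:                   # exclusion: at most one particle per site
        occupied[pos[i]] = False
        occupied[target] = True
        pos[i] = target
        if i == tagged:
            displacement += step

print("tagged particle displacement:", displacement)
```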

    Cosmic homogeneity: a spectroscopic and model-independent measurement

    Cosmology relies on the Cosmological Principle, i.e., the hypothesis that the Universe is homogeneous and isotropic on large scales. This implies in particular that the counts of galaxies should approach a homogeneous scaling with volume at sufficiently large scales. Testing homogeneity is crucial for a correct interpretation of the physical assumptions underlying the current cosmic acceleration and structure formation of the Universe. In this Letter, we use the Baryon Oscillation Spectroscopic Survey to make the first spectroscopic and model-independent measurements of the angular homogeneity scale $\theta_{\rm h}$. Applying four statistical estimators, we show that the angular distribution of galaxies in the range 0.46 < z < 0.62 is consistent with homogeneity at large scales, and that $\theta_{\rm h}$ varies with redshift, indicating a smoother Universe in the past. These results are in agreement with the foundations of the standard cosmological paradigm. Comment: 5 pages, 2 figures, version accepted by MNRAS
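
    To illustrate the kind of measurement the abstract describes, here is a minimal sketch of one common homogeneity estimator: the mean counts-in-caps around each galaxy, scaled by the same counts in a random (homogeneous) catalogue, with the angular homogeneity scale read off where the ratio settles to within roughly 1% of unity. The synthetic sky positions, the 1% threshold, and the simplified scaling are assumptions; this is not necessarily one of the four estimators used in the paper.

```python
# Scaled counts-in-caps sketch: N_gal(<theta) / N_rand(<theta) as a function of
# angular radius theta, on synthetic stand-in catalogues of unit vectors.
import numpy as np

rng = np.random.default_rng(1)

def random_sky(n):
    """Uniformly distributed points on the sphere, returned as unit vectors."""
    ra = rng.uniform(0, 2 * np.pi, n)
    dec = np.arcsin(rng.uniform(-1, 1, n))
    return np.column_stack([np.cos(dec) * np.cos(ra),
                            np.cos(dec) * np.sin(ra),
                            np.sin(dec)])

def mean_counts_in_caps(points, thetas):
    """Average number of neighbours within angular radius theta, per centre."""
    cosg = np.clip(points @ points.T, -1.0, 1.0)
    sep = np.arccos(cosg)
    np.fill_diagonal(sep, np.inf)              # exclude self-pairs
    return np.array([(sep < t).sum(axis=1).mean() for t in thetas])

galaxies = random_sky(2000)                    # stand-in "galaxy" catalogue
randoms = random_sky(2000)                     # homogeneous reference catalogue
thetas = np.radians(np.linspace(1, 20, 40))

scaled = mean_counts_in_caps(galaxies, thetas) / mean_counts_in_caps(randoms, thetas)

# Homogeneity scale: smallest theta at which the scaled counts are within 1% of 1.
within = np.where(np.abs(scaled - 1.0) < 0.01)[0]
theta_h = np.degrees(thetas[within[0]]) if within.size else None
print("estimated angular homogeneity scale [deg]:", theta_h)
```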